Avoiding Coding Tricks by Hyperrobust Learning

Authors

  • Matthias Ott
  • Frank Stephan
Abstract

The present work introduces and justifies the notion of hyperrobust learning, where one fixed learner has to learn all functions in a given class plus their images under primitive recursive operators. The following is shown: this notion of learnability does not change if the class of primitive recursive operators is replaced by a larger enumerable class of operators. A class is hyperrobustly Ex-learnable iff it is a subclass of a recursively enumerable family of total functions. So the notion of hyperrobust learning overcomes a problem of the traditional definitions of robustness, which either do not preserve learning by enumeration or still permit topological coding tricks for the learning criterion Ex. Hyperrobust BC-learning as well as the hyperrobust version of Ex-learning by teams are more powerful than hyperrobust Ex-learning. The notion of bounded totally reliable BC-learning lies properly between hyperrobust Ex-learning and hyperrobust BC-learning. Furthermore, the bounded totally reliably BC-learnable classes are characterized in terms of infinite branches of certain enumerable families of bounded recursive trees. A class of infinite branches of a further family of trees separates hyperrobust BC-learning from totally reliable BC-learning.
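The characterization above ties hyperrobust Ex-learnability to recursively enumerable families of total functions, for which the classical strategy is Gold-style "learning by enumeration". As an illustrative sketch only (the function family and names below are hypothetical, not from the paper), such a learner conjectures the least index in the enumeration whose function is consistent with all data seen so far; on any target inside the family, the conjecture eventually stabilizes, which is exactly Ex-identification:

```python
def learner(family, data):
    """Return the least index i such that family[i] agrees with data.

    family: finite list of total functions (a stand-in for an r.e. family)
    data:   the observed initial segment f(0), f(1), ..., f(n-1)
    """
    for i, f in enumerate(family):
        # Keep the first hypothesis consistent with every observed value.
        if all(f(x) == y for x, y in enumerate(data)):
            return i
    return None  # no consistent hypothesis in this finite stand-in


# A toy family and a target drawn from it:
family = [lambda x: 0, lambda x: x, lambda x: x * x]
target = family[2]

# Feed the learner growing data segments; the sequence of conjectures
# converges: once the data refutes indices 0 and 1, index 2 is final.
conjectures = [learner(family, [target(x) for x in range(n)])
               for n in range(1, 5)]
# conjectures == [0, 1, 2, 2]
```

The sketch also shows why coding tricks are an issue: a learner could instead smuggle the data into the syntax of its hypotheses, and the hyperrobust setting is designed to rule such behavior out.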


Similar Articles

Models of Cooperative Teaching and Learning

While most supervised machine learning models assume that training examples are sampled at random or adversarially, this article is concerned with models of learning from a cooperative teacher that selects “helpful” training examples. The number of training examples a learner needs for identifying a concept in a given class C of possible target concepts (sample complexity of C) is lower in mode...


Teaching Dimensions based on Cooperative Learning

The problem of how a teacher and a learner can cooperate in the process of learning concepts from examples in order to minimize the required sample size without “coding tricks” has been widely addressed, yet without achieving teaching and learning protocols that meet what seems intuitively an optimal choice for selecting samples in teaching. We introduce the model of subset teaching sets, based...


Learning without Coding

Iterative learning is a model of language learning from positive data, due to Wiehagen. When compared to a learner in Gold’s original model of language learning from positive data, an iterative learner can be thought of as memory-limited. However, an iterative learner can memorize some input elements by coding them into the syntax of its hypotheses. A main concern of this paper is: to what exte...


Asynchronous COMID: the theoretic basis for transmitted data sparsification tricks on Parameter Server

Asynchronous FTRL and the L2 norm computed at the server are two widely used tricks for improving training efficiency, but their convergence has not been rigorously proved. In this paper, we propose the asynchronous COMID algorithm and prove its convergence. Then, we establish the equivalence between asynchronous COMID and the above two tricks; thus, the convergence of both tricks is also proved. Experimental result...


Studying Workplace Learning: Case Study

The present research aims to study workplace learning in the Petrochemical Industries National Company. The research is of a qualitative nature and uses the grounded theory approach introduced by Charmaz. Data collection methods included interviews, analysis of documents, observation, and focus groups. The statistical population of the research consisted of 350 employees, 16 of whom were selected thr...



Journal title:

Volume   Issue 

Pages  -

Publication date: 1999